Learning Visually Grounded Common Sense Spatial Knowledge for Implicit Spatial Language

Authors

  • Guillem Collell
  • Marie-Francine Moens
Abstract

Spatial understanding is crucial for any agent that navigates in a physical world. Computational and cognitive frameworks often model spatial representations as spatial templates, i.e., regions of acceptability for two objects under an explicit spatial preposition such as “left” or “below” (Logan and Sadler 1996). In contrast to previous work that defines spatial templates for explicit spatial language only (Malinowski and Fritz 2014; Moratz and Tenbrink 2006), we extend this concept to implicit spatial language, i.e., relationships (usually actions) that do not state the relative location of the two objects explicitly (e.g., “dog under table”) but only implicitly (e.g., “girl riding horse”). Unlike with explicit relationships, predicting spatial arrangements from implicit spatial language requires common sense spatial knowledge about the objects and actions. Furthermore, prior work that leverages common sense spatial knowledge to solve tasks such as visual paraphrasing (Lin and Parikh 2015) or object labeling (Shiang et al. 2017) does not aim to predict (unseen) spatial configurations.
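To make the spatial-template idea concrete, the sketch below represents a template as a small grid of acceptability scores over the relative position of the located object with respect to the reference object. The grid size, the Gaussian parameterization, and the example triples and their parameters are all illustrative assumptions, not the model proposed in the paper (which learns such templates from visual data).

```python
import numpy as np

# A minimal sketch of a spatial template as a 2D acceptability grid.
# Everything here (grid size, Gaussian parameterization, example triples
# and their parameters) is a hypothetical illustration, not the authors'
# learned model.

GRID = 7  # 7x7 map over relative (dx, dy) offsets, centered on the reference object

def gaussian_template(mu, sigma):
    """Acceptability map: higher score = more plausible position of the
    located object relative to the reference object."""
    xs = np.linspace(-1.0, 1.0, GRID)
    dx, dy = np.meshgrid(xs, xs)  # dy > 0 is read here as "above the reference"
    score = np.exp(-((dx - mu[0]) ** 2 + (dy - mu[1]) ** 2) / (2 * sigma ** 2))
    return score / score.sum()  # normalize so the map sums to 1

# Toy common-sense priors keyed by (object, relation, object) triples:
templates = {
    ("girl", "riding", "horse"): gaussian_template(mu=(0.0, 0.6), sigma=0.3),   # above
    ("dog", "under", "table"):   gaussian_template(mu=(0.0, -0.6), sigma=0.3),  # below
}

# Most acceptable relative position for an implicit relationship:
t = templates[("girl", "riding", "horse")]
row, col = np.unravel_index(t.argmax(), t.shape)
print(row, col)
```

In the paper's setting, such maps would be predicted for unseen object-action-object triples rather than looked up from a hand-built table, which is exactly where the common sense knowledge comes in.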


Similar articles

Learning Spatial Knowledge for Text to 3D Scene Generation

We address the grounding of natural language to concrete spatial constraints, and the inference of implicit pragmatics in 3D environments. We apply our approach to the task of text-to-3D scene generation. We present a representation for common sense spatial knowledge and an approach to extract it from 3D scene data. In text-to-3D scene generation, a user provides as input natural language text from ...


Acquiring Common Sense Spatial Knowledge through Implicit Spatial Templates

Spatial understanding is a fundamental problem with wide-reaching real-world applications. The representation of spatial knowledge is often modeled with spatial templates, i.e., regions of acceptability of two objects under an explicit spatial relationship (e.g., “on”, “below”, etc.). In contrast with prior work that restricts spatial templates to explicit spatial prepositions (e.g., “glass on t...


Study of Numerical Processing Speed, Implicit and Explicit Memory, Active and Passive Memory, Conservation Abilities, and Visual-Spatial Skills of Students with Dyscalculia

Background and Purpose: Learning disorder is one of the common disorders in students and can lead to educational problems and secondary disorders. Based on psychopathological criteria, dyscalculia is one of the subcategories of learning disorder. Children with this disorder have problems in the perception of spatial relations and in different cognitive abilities. Theref...


Grounded Models as a Basis for Intuitive and Deductive Reasoning: the Acquisition of Logical Categories

Grounded models differ from axiomatic theories in establishing explicit connections between language and reality that are learned through language games. This paper describes how grounded models are constructed by visually grounded autonomous agents playing different types of language games, and explains how they can be used for intuitive reasoning. It proposes a particular language game that c...


Learning spatial prepositions by Iranian EFL learners

The aim of the present study is threefold. The first aim is to determine whether there is any difference between language learners of different proficiency levels in their use of spatial prepositions. The second aim is to reveal whether the native language of the participants has any effect on applying the appropriate prepositions, and also to find which spatial prepositions are difficult to acquire. The present paper examin...



Publication year: 2017